Note: H-1B visa sponsorship is not available for this position.
Job Description: Data Engineer
Responsibilities
- Design, build, and maintain scalable, reliable data pipelines and systems to support data ingestion, transformation, and storage.
- Develop and optimize ETL (Extract, Transform, Load) processes to handle large datasets efficiently.
- Collaborate with data scientists, analysts, and software engineers to implement robust data solutions.
- Implement and manage databases, data warehouses, and data lakes (e.g., Redshift, Snowflake, BigQuery).
- Create and maintain data architecture documentation and workflows.
- Monitor pipeline performance and troubleshoot data-related issues.
- Ensure data quality and governance by implementing validation mechanisms and compliance standards.
- Work with cloud-based platforms (AWS, Azure, GCP) for data processing and storage.
- Drive continuous improvement in data engineering practices and processes.
Qualifications
Required Skills
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- 3+ years of experience as a Data Engineer or in a similar role.
- Strong experience with SQL and relational databases.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience building and optimizing ETL pipelines and workflows.
- Familiarity with big data technologies like Hadoop, Spark, or Kafka.
- Knowledge of cloud-based data services (e.g., AWS Glue, Azure Data Factory, Google Cloud Dataflow).
- Strong understanding of data modeling, data warehousing, and data lakes.
- Hands-on experience with data visualization tools (e.g., Tableau, Power BI).
Preferred Skills
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Familiarity with real-time data processing.
- Experience with CI/CD pipelines for data engineering.
- Knowledge of data governance and compliance (e.g., GDPR, CCPA).
- Cloud certification such as AWS Certified Data Analytics, Microsoft Azure Data Engineer Associate, or Google Cloud Professional Data Engineer.